On the Conditions of Sparse Parameter Estimation via Log-Sum Penalty Regularization

Authors

  • Zheng Pan
  • Guangdong Hou
  • Changshui Zhang
Abstract

For high-dimensional sparse parameter estimation problems, Log-Sum Penalty (LSP) regularization effectively reduces the required sample size in practice. However, theoretical analysis supporting this empirical experience has been lacking. The analysis in this article shows that, like ℓ0-regularization, a sample size of O(s) suffices for proper LSP, where s is the number of non-zero components of the true parameter. We also propose an efficient algorithm for solving the LSP regularization problem. The solutions given by the proposed algorithm yield consistent parameter estimates under less restrictive conditions than ℓ1-regularization.
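The abstract leaves the algorithm unspecified. For illustration only, a standard way to attack the LSP objective min_x ½‖y − Ax‖² + λ ∑_i log(1 + |x_i|/ε) is majorize-minimize via iteratively reweighted ℓ1 (the scheme of Candès, Wakin, and Boyd), where each outer step solves a weighted lasso with weights w_i = 1/(|x_i| + ε). The Python sketch below uses an inner ISTA loop for the weighted lasso; the function name and all parameters (lam, eps, loop counts) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def lsp_reweighted_l1(A, y, lam=0.1, eps=0.1, n_outer=10, n_inner=200):
    """Sketch of LSP regularization via iteratively reweighted l1.

    Minimizes 0.5*||y - A @ x||^2 + lam * sum(log(1 + |x_i|/eps)).
    NOT the paper's algorithm; an illustrative majorize-minimize scheme.
    """
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-term gradient
    for _ in range(n_outer):
        w = 1.0 / (np.abs(x) + eps)        # linearization of the log-sum penalty
        for _ in range(n_inner):           # ISTA on the weighted-lasso surrogate
            z = x - A.T @ (A @ x - y) / L
            x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)
    return x

# Toy usage: recover an s-sparse vector from m < n Gaussian measurements.
rng = np.random.default_rng(0)
m, n, s = 60, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
x_hat = lsp_reweighted_l1(A, A @ x_true)
```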

Related Articles

Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
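The snippet cuts off before the algorithmic details. As a rough sketch under stated assumptions (small dense system, Golub-Kahan projection omitted), model-space IRLS for an L1-stabilized inversion replaces ||x||_1 by the quadratic surrogate ∑ x_i²/(|x_i| + ε) at the current iterate and solves a reweighted normal equation; all names and defaults below are hypothetical.

```python
import numpy as np

def irls_l1(A, y, lam=1e-2, eps=1e-6, n_iter=30):
    """Model-space IRLS sketch for min 0.5*||A @ x - y||^2 + lam*||x||_1.

    Toy dense version: the paper works on a small projected system from
    Golub-Kahan bidiagonalization, which is omitted here.
    """
    x = np.zeros(A.shape[1])
    AtA, Aty = A.T @ A, A.T @ y
    for _ in range(n_iter):
        W = np.diag(1.0 / (np.abs(x) + eps))   # reweighting: |x_i| ~ x_i^2 / (|x_i| + eps)
        x = np.linalg.solve(AtA + lam * W, Aty)
    return x
```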

Sparsity regularization for image reconstruction with Poisson data

This work investigates three penalized-likelihood expectation maximization (EM) algorithms for image reconstruction with Poisson data where the images are known a priori to be sparse in the space domain. The penalty functions considered are the ℓ1 norm, the ℓ0 "norm," and a penalty function based on the sum of logarithms of pixel values, R(x) = ∑_{j=1}^{n_p} log(x_j/δ + 1). Our results show that the ...
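To make the penalty concrete, the sketch below evaluates the reconstructed formula R(x) = ∑_{j=1}^{n_p} log(x_j/δ + 1) alongside the penalized Poisson negative log-likelihood such EM algorithms minimize. The system matrix H and the penalty weight beta are generic placeholders of mine; only R and δ come from the snippet.

```python
import numpy as np

def log_penalty(x, delta=1.0):
    """R(x) = sum_j log(x_j/delta + 1): roughly linear (l1-like) for x_j << delta,
    logarithmic for x_j >> delta, so large pixel values are penalized less."""
    return np.sum(np.log(x / delta + 1.0))

def penalized_neg_loglik(x, H, y, beta=0.1, delta=1.0):
    """Penalized Poisson objective (constants dropped): nonnegative image x
    with mean counts H @ x; beta weights the sparsity penalty (my notation)."""
    mu = H @ x
    return np.sum(mu - y * np.log(mu)) + beta * log_penalty(x, delta)
```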

High dimensional thresholded regression and shrinkage effect

High dimensional sparse modelling via regularization provides a powerful tool for analysing large-scale data sets and obtaining meaningful interpretable models. The use of nonconvex penalty functions shows advantage in selecting important features in high dimensions, but the global optimality of such methods still demands more understanding. We consider sparse regression with a hard thresholding ...
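The snippet truncates before the estimator is defined; the paper studies regression with a hard-thresholding penalty. As loosely related flavor only, here is the classic iterative hard thresholding (IHT) loop, which keeps the s largest coefficients after each gradient step; it is not the paper's method.

```python
import numpy as np

def iht(A, y, s, n_iter=100):
    """Iterative hard thresholding: gradient step on 0.5*||A @ x - y||^2,
    then project onto s-sparse vectors by keeping the s largest |x_i|."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        x -= step * (A.T @ (A @ x - y))
        small = np.argsort(np.abs(x))[:-s]   # indices of all but the s largest entries
        x[small] = 0.0
    return x
```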

The Log-Shift Penalty for Adaptive Estimation of Multiple Gaussian Graphical Models

Sparse Gaussian graphical models characterize sparse dependence relationships between random variables in a network. To estimate multiple related Gaussian graphical models on the same set of variables, we formulate a hierarchical model, which leads to an optimization problem with a nonconvex log-shift penalty function. We show that under mild conditions the optimization problem is convex despit...

Some Sharp Performance Bounds for Least Squares Regression with L1 Regularization

We derive sharp performance bounds for least squares regression with L1 regularization from parameter estimation accuracy and feature selection quality perspectives. The main result proved for L1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313–2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358–2364]. Moreo...

Journal:
  • CoRR

Volume abs/1308.6504  Issue 

Pages  -

Publication date 2013